Explicit Approximations of the Gaussian Kernel
Authors
Abstract
We investigate training and using Gaussian kernel SVMs by approximating the kernel with an explicit finite-dimensional polynomial feature representation based on the Taylor expansion of the exponential. Although not as efficient as the recently proposed random Fourier features [Rahimi and Recht, 2007] in terms of the number of features, we show how this polynomial representation can provide a better approximation in terms of the computational cost involved. This makes our “Taylor features” especially attractive for use on very large data sets, in conjunction with online or stochastic training.
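To make the construction concrete, here is a minimal sketch of such a Taylor feature map (an illustrative tensor-product variant with names and defaults of our own choosing, not the paper's optimized implementation):

```python
import math
from itertools import product

import numpy as np

def taylor_features(x, sigma=1.0, degree=3):
    """Explicit feature map phi such that phi(x) . phi(y) approximates
    the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2)), obtained by
    truncating the Taylor series of exp(<x, y> / sigma^2) at `degree`."""
    x = np.asarray(x, dtype=float)
    d = x.size
    feats = []
    for k in range(degree + 1):
        coef = 1.0 / math.sqrt(sigma ** (2 * k) * math.factorial(k))
        # One feature per index tuple (i1, ..., ik): coef * x_i1 * ... * x_ik.
        for idx in product(range(d), repeat=k):
            feats.append(coef * np.prod(x[list(idx)]) if k else coef)
    # Shared prefactor: k(x,y) = e^{-|x|^2/2s^2} e^{-|y|^2/2s^2} e^{<x,y>/s^2}.
    return np.exp(-(x @ x) / (2 * sigma ** 2)) * np.array(feats)

# The dot product of two feature vectors approximates the exact kernel value.
x, y = np.array([0.3, -0.5]), np.array([0.1, 0.4])
print(taylor_features(x) @ taylor_features(y))   # approximation
print(np.exp(-np.sum((x - y) ** 2) / 2.0))       # exact kernel, sigma = 1
```

The tensor-product form above uses d^k features at degree k; collapsing repeated index tuples into monomials with multinomial weights, the standard trick for an efficient implementation, leaves one feature per distinct monomial.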
Similar Papers
Angular Smoothing and Spatial Diffusion from the Feynman Path Integral Representation of Radiative Transfer
The propagation kernel for time-dependent radiative transfer is represented by a Feynman Path Integral (FPI). The FPI is approximately evaluated in the spatial-Fourier domain. Spatial diffusion is exhibited in the kernel when the approximations lead to a Gaussian dependence on the Fourier-domain wave vector. The approximations provide an explicit expression for the diffusion matrix. They also p...
Scalable Log Determinants for Gaussian Process Kernel Learning
For applications as varied as Bayesian neural networks, determinantal point processes, elliptical graphical models, and kernel learning for Gaussian processes (GPs), one must compute a log determinant of an n × n positive definite matrix, and its derivatives – leading to prohibitive O(n³) computations. We propose novel O(n) approaches to estimating these quantities from only fast matrix vector mu...
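As a rough illustration of the setting, the sketch below estimates log det(A) from matrix-vector products alone via stochastic Lanczos quadrature, one standard member of this family of estimators (function names and parameters are our own; this is not necessarily the authors' exact algorithm):

```python
import numpy as np

def lanczos(matvec, v, m):
    """m steps of Lanczos using only matrix-vector products; returns the
    tridiagonal coefficients. Assumes no breakdown (fine for random probes)."""
    q, q_prev, beta = v / np.linalg.norm(v), np.zeros_like(v), 0.0
    alphas, betas = [], []
    for _ in range(m):
        w = matvec(q) - beta * q_prev
        alpha = q @ w
        w -= alpha * q
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        betas.append(beta)
        q_prev, q = q, w / beta
    return np.array(alphas), np.array(betas[:-1])

def logdet_slq(matvec, n, num_probes=30, m=20, seed=0):
    """Estimate log det(A) = tr(log A) for positive definite A accessed
    only through `matvec`, via Hutchinson probes plus Gauss quadrature."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe, ||z||^2 = n
        a, b = lanczos(matvec, z, m)
        T = np.diag(a) + np.diag(b, 1) + np.diag(b, -1)
        evals, evecs = np.linalg.eigh(T)
        # Gauss quadrature for z^T log(A) z: nodes are the Ritz values,
        # weights the squared first components of T's eigenvectors.
        total += n * np.sum(evecs[0] ** 2 * np.log(evals))
    return total / num_probes

# Demo on an explicit PSD matrix; the estimator itself only sees matvecs.
B = np.random.default_rng(1).normal(size=(200, 200))
A = B @ B.T + 200 * np.eye(200)
print(logdet_slq(lambda v: A @ v, 200), np.linalg.slogdet(A)[1])
```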
Speech Enhancement Using Gaussian Mixture Models, Explicit Bayesian Estimation and Wiener Filtering
Gaussian Mixture Models (GMMs) of power spectral densities of speech and noise are used with explicit Bayesian estimations in Wiener filtering of noisy speech. No assumption is made on the nature or stationarity of the noise. No voice activity detection (VAD) or any other means is employed to estimate the input SNR. The GMM mean vectors are used to form sets of over-determined system of equatio...
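The GMM-based Bayesian estimation is beyond a short excerpt, but the final Wiener filtering step has a simple closed form. A minimal sketch, assuming the speech and noise power spectra have already been estimated (names and shapes are illustrative):

```python
import numpy as np

def wiener_enhance(noisy_stft, speech_psd, noise_psd):
    """Apply the per-frequency Wiener gain H = S_s / (S_s + S_n) to a
    noisy STFT of shape (freq_bins, frames). Estimating the two PSDs
    (e.g. with a GMM-based Bayesian scheme) is the hard part; the
    filter itself is a single broadcasted multiply."""
    gain = speech_psd / (speech_psd + noise_psd)
    return gain[:, None] * noisy_stft
```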
Quasi-Monte Carlo Feature Maps for Shift-Invariant Kernels (arXiv:1412.8293v2 [stat.ML], 9 Aug 2015)
We consider the problem of improving the efficiency of randomized Fourier feature maps to accelerate training and testing speed of kernel methods on large datasets. These approximate feature maps arise as Monte Carlo approximations to integral representations of shift-invariant kernel functions (e.g., Gaussian kernel). In this paper, we propose to use Quasi-Monte Carlo (QMC) approximations inst...
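A compact sketch of the idea: keep the random Fourier feature map but draw the frequencies from a low-discrepancy sequence instead of i.i.d. Gaussian samples (scipy's qmc module and all parameter choices here are our own illustrative assumptions):

```python
import numpy as np
from scipy.stats import norm, qmc

def qmc_fourier_features(X, num_features=256, sigma=1.0, seed=0):
    """Feature map z with z(x) . z(y) ~ exp(-||x - y||^2 / (2 sigma^2)).
    Frequencies come from a scrambled Halton sequence pushed through the
    Gaussian inverse CDF, rather than i.i.d. Monte Carlo draws."""
    n, d = X.shape
    u = qmc.Halton(d=d, scramble=True, seed=seed).random(num_features)
    u = np.clip(u, 1e-9, 1 - 1e-9)        # keep the inverse CDF finite
    W = norm.ppf(u) / sigma               # frequencies ~ N(0, I / sigma^2)
    b = np.random.default_rng(seed).uniform(0.0, 2 * np.pi, num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W.T + b)

# Z @ Z.T approximates the exact Gaussian kernel matrix of X.
X = np.random.default_rng(1).normal(size=(5, 3))
Z = qmc_fourier_features(X)
```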
Gaussian Convolutions. Numerical Approximations Based on Interpolation
Gaussian convolutions are perhaps the most often used image operators in low-level computer vision tasks. Surprisingly though, there are precious few articles that describe efficient and accurate implementations of these operators. In this paper we describe numerical approximations of Gaussian convolutions based on interpolation. We start with the continuous convolution integral and use an inte...
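For contrast, the naive discretization that interpolation-based schemes refine is simply convolution with a sampled, renormalized Gaussian. A minimal 1-D baseline sketch (this is the plain sampled-kernel approach, not the paper's interpolation construction):

```python
import numpy as np

def gaussian_convolve(signal, sigma, truncate=4.0):
    """Naive 1-D Gaussian convolution: sample the kernel on the integer
    grid, renormalize after truncation, and convolve. Interpolation-based
    approximations of the continuous integral improve on this baseline."""
    radius = int(truncate * sigma + 0.5)
    t = np.arange(-radius, radius + 1)
    kernel = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    kernel /= kernel.sum()
    return np.convolve(np.asarray(signal, dtype=float), kernel, mode="same")
```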
Journal: CoRR
Volume: abs/1109.4603
Issue: -
Pages: -
Publication date: 2011